Deep Diffusion-Invariant Wasserstein Distributional Classification

Neural Information Processing Systems

How can the stochastic properties of input data and labels be appropriately captured to handle severe perturbations? To answer this question, we represent both input data and target labels as probability measures (i.e., probability densities), denoted as µ_n and ν̂_n, respectively, in the Wasserstein space and solve a distance-based classification problem (i.e.,
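The abstract's core idea, classifying an input by its distance to reference probability measures in Wasserstein space, can be illustrated with a minimal sketch. This is not the paper's method (it omits the diffusion-invariant representation entirely); it is a nearest-measure classifier over 1-D empirical samples, where the W1 distance reduces to the mean absolute difference of sorted samples. The class labels and distributions are hypothetical.

```python
import numpy as np

def wasserstein_1d(a, b):
    # For equal-size 1-D empirical samples, the W1 distance is the
    # mean absolute difference between the sorted samples.
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

def classify(sample, class_measures):
    # Assign the sample to the class whose reference measure is
    # nearest in Wasserstein distance (distance-based classification).
    return min(class_measures, key=lambda c: wasserstein_1d(sample, class_measures[c]))

rng = np.random.default_rng(0)
refs = {"A": rng.normal(0.0, 1.0, 500), "B": rng.normal(3.0, 1.0, 500)}
perturbed = rng.normal(0.1, 1.2, 500)   # noisy sample drawn near class A
print(classify(perturbed, refs))        # -> "A"
```

Because the comparison is between whole distributions rather than point estimates, moderate perturbations of the sample shift the distance only slightly, which is the robustness intuition the abstract appeals to.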



8710ef761bbb29a6f9d12e4ef8e4379c-Paper.pdf

Neural Information Processing Systems

In machine learning, these have a know-it-when-you-see-it character; e.g., changing the gender of a sentence's subject changes a sentiment predictor's output. To check for spurious correlations, we can 'stress test' models by perturbing irrelevant parts of input data and seeing if model predictions change. In this paper, we study stress testing using the tools of causal inference. We introduce counterfactual invariance as a formalization of the requirement that changing irrelevant parts of the input shouldn't change model predictions.
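The stress-testing procedure the abstract describes, perturb an irrelevant attribute and flag examples whose prediction flips, can be sketched in a few lines. Everything here is hypothetical: a toy keyword-based sentiment model and a gender-swap perturbation stand in for a real model and a real counterfactual transformation.

```python
def predict_sentiment(text):
    # Toy classifier (hypothetical): positive if any cue word appears.
    positive_cues = {"great", "excellent", "happy"}
    return "positive" if positive_cues & set(text.lower().split()) else "negative"

def swap_gender(text):
    # Perturb an attribute that should be irrelevant to sentiment:
    # swap gendered subject words in the sentence.
    swaps = {"he": "she", "she": "he", "his": "her", "her": "his"}
    return " ".join(swaps.get(w, w) for w in text.split())

def stress_test(model, perturb, examples):
    # Return the examples whose prediction changes under the
    # perturbation, i.e. violations of counterfactual invariance.
    return [x for x in examples if model(x) != model(perturb(x))]

examples = ["he gave a great talk", "she was happy with it"]
print(stress_test(predict_sentiment, swap_gender, examples))  # -> []
```

An empty result means the toy model is invariant to this perturbation on these examples; a non-empty list would be evidence of a spurious correlation.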




0d9057d84a9fc37523bf826232ea6820-Paper-Conference.pdf

Neural Information Processing Systems

In the case of coupled skew tent maps, the proposed method consistently outperforms a five-layer Deep Neural Network (DNN) and a Long Short-Term Memory (LSTM) architecture for unidirectional coupling coefficient values ranging from 0.1 to 0.7.
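For readers unfamiliar with the benchmark system, a pair of unidirectionally coupled skew tent maps can be generated as below. This is only a sketch of the dynamical system, not of the paper's method; the skew parameter, initial conditions, and coupling form (a convex combination where the driver x feeds into the response y) are assumptions chosen for illustration.

```python
def skew_tent(x, a=0.65):
    # Skew tent map on [0, 1] with skew parameter a (value assumed here).
    return x / a if x < a else (1.0 - x) / (1.0 - a)

def coupled_series(n, eps, x0=0.3, y0=0.7):
    # Unidirectional coupling: x evolves freely, y is driven by x
    # with coupling coefficient eps in [0, 1].
    xs, ys = [x0], [y0]
    for _ in range(n - 1):
        ys.append((1.0 - eps) * skew_tent(ys[-1]) + eps * skew_tent(xs[-1]))
        xs.append(skew_tent(xs[-1]))
    return xs, ys

# Coupling coefficient inside the 0.1-0.7 range studied in the abstract.
xs, ys = coupled_series(1000, eps=0.4)
```

Since the map sends [0, 1] to [0, 1] and the response update is a convex combination, both series stay in the unit interval; sweeping eps over 0.1 to 0.7 reproduces the coupling regime referred to above.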